Culture Question:

Is the Western culture like in films? Are all the films true?

(posted: Oct 2009)

Is Western culture like in films? That depends on what films you watch. Dramas or comedies from the West will give you a feeling for what life is like for some people in the West. But you must remember that films exaggerate in order to create more drama, or more laughs, and they can only ever give you slices of Western life, not the whole pie.

Movies are made in order to make money, and movies with a lot of action (action-adventure films and crime thrillers) tend to make a lot of money. It would be a mistake to think that they represent what is normal in the West, though. Car chases and gun fights are not a regular part of most Americans’ lives.

Are all films true? Well, the simplest answer is no: films are not true; they are made up by the filmmakers, even when they are based on true stories. But life, and films, are not simple things. There is some truth even in things that are not, strictly speaking, ‘true’. Pablo Picasso, a famous artist in the West, once said, “Art is a lie that makes us realize truth.” Now, many films are not Art with a capital ‘A’; they are made just as entertainment. But in some films we can see reflections of the culture that made them, and reflections of what it is to be human.

Source:

  1. ThinkExist, ‘Pablo Picasso Quotes’, http://thinkexist.com/quotation/art_is_a_lie_that_makes_us_realize_truth/143234.html (accessed 07-Oct-2009)


[Image: movie poster mosaic]